20 research outputs found

    Statistical techniques for the detection and analysis of solar explosive events

    Solar explosive events are commonly explained as small-scale magnetic reconnection events, although unambiguous confirmation of this scenario remains elusive due to limited spatial resolution and the lack of statistical analyses of sufficiently large samples of such events. In this work, we propose a sound statistical treatment of data cubes consisting of a temporal sequence of long-slit spectra of the solar atmosphere. The analysis comprises all the stages from explosive event detection to its characterization and the subsequent sample study. We have designed two complementary approaches based on combinations of standard statistical techniques (Robust Principal Component Analysis in one approach; wavelet decomposition and Independent Component Analysis in the other) in order to obtain minimally biased samples. These techniques are implemented in the spirit of letting the data speak for themselves. The analysis is carried out for two spectral lines: the C IV line at 1548.2 angstroms and the Ne VIII line at 770.4 angstroms. We find significant differences between the characteristics of the line profiles emitted in the proximity of two active regions and in the quiet Sun, most visible in the relative importance of a separate population of red-shifted profiles. We also find a higher frequency of explosive events near the active regions, and in the C IV line. The distribution of the explosive event characteristics is interpreted in the light of recent numerical simulations. Finally, we point out several regions of the parameter space where the reconnection model has to be refined in order to explain the observations. Comment: Accepted for publication in Astronomy and Astrophysics (in Section 9, The Sun) on 18/01/2011. 17 pages, 22 figures
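As a rough illustration of the subspace-based detection idea (a simplified stand-in for the Robust PCA approach described above, not the authors' actual pipeline), one can learn a low-dimensional "quiet-Sun" subspace from a reference set of line profiles and flag profiles with a large reconstruction residual as candidate events. All data below are synthetic; the 5-sigma threshold is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
wavelengths = np.linspace(-1.0, 1.0, 50)      # offset from line centre (arbitrary units)
quiet = np.exp(-wavelengths**2 / 0.1)         # a typical quiet-Sun Gaussian profile
profiles = quiet + 0.02 * rng.standard_normal((200, 50))
# Inject one red-shifted secondary component, mimicking an explosive event.
profiles[150] += 0.5 * np.exp(-(wavelengths - 0.5) ** 2 / 0.02)

train = profiles[:100]                        # reference set assumed mostly quiet
mean = train.mean(axis=0)
U, s, Vt = np.linalg.svd(train - mean, full_matrices=False)
k = 3                                         # dimension of the quiet-Sun subspace

# Project every profile onto the learned subspace; events live outside it,
# so their reconstruction residual is large.
coeffs = (profiles - mean) @ Vt[:k].T
recon = coeffs @ Vt[:k] + mean
err = np.sqrt(((profiles - recon) ** 2).mean(axis=1))
candidates = np.where(err > err.mean() + 5 * err.std())[0]
```

On this toy cube only the injected profile is flagged; on real data the thresholding and the robustness of the subspace estimate are exactly where the paper's more careful statistical treatment matters.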

    Properties of ultra-cool dwarfs with Gaia. An assessment of the accuracy for the temperature determination

    We aimed to assess the accuracy of the Gaia Teff and logg estimates as derived with current models and observations. We assessed the validity of several inference techniques for deriving the physical parameters of ultra-cool dwarf stars. We used synthetic spectra derived from ultra-cool dwarf models to construct (train) the regression models. We derived the intrinsic uncertainties of the best inference models and assessed their validity by comparing the estimated parameters with the values reported in the literature for a sample of ultra-cool dwarf stars observed from the ground. We estimated the total number of ultra-cool dwarfs per spectral subtype, and obtained values that can be summarised (in orders of magnitude) as 400000 objects in the M5-L0 range, 600 objects between L0 and L5, 30 objects between L5 and T0, and 10 objects between T0 and T8. A bright ultra-cool dwarf (with Teff = 2500 K and logg = 3.5) will be detected by Gaia out to approximately 220 pc, while for Teff = 1500 K (spectral type L5) and the same surface gravity, this maximum distance reduces to 10-20 pc. The RMSE of the prediction deduced from ground-based spectra of ultra-cool dwarfs simulated at the Gaia spectral range and resolution, and for a Gaia magnitude G = 20, is 213 K and 266 K for the models based on k-nearest neighbours and Gaussian process regression, respectively. These are total errors in the sense that they include the internal and external errors, the latter caused by the inability of the synthetic spectral models (used for the construction of the regression models) to exactly reproduce the observed spectra, and by the large uncertainties in the current calibrations of spectral types and effective temperatures. Comment: 18 pages, 17 figures, accepted by Astronomy & Astrophysics
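One of the two regression families quoted above, k-nearest neighbours, is simple enough to sketch: predict Teff for a new spectrum as the average Teff of the k training spectra closest to it in flux space. The spectra below are synthetic toy placeholders, not actual ultra-cool dwarf models, and the temperature-to-flux mapping is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_spectrum(teff, n_pix=40):
    # Toy spectrum whose slope depends monotonically on temperature.
    x = np.linspace(0.0, 1.0, n_pix)
    return (teff / 3000.0) * x + 0.02 * rng.standard_normal(n_pix)

# "Training" set standing in for the synthetic model grid.
train_teff = rng.uniform(1500.0, 3000.0, size=300)
train_X = np.array([make_spectrum(t) for t in train_teff])

def knn_predict(spectrum, k=5):
    # Euclidean distance to every training spectrum; average the
    # temperatures of the k closest matches.
    d = np.linalg.norm(train_X - spectrum, axis=1)
    return train_teff[np.argsort(d)[:k]].mean()

estimate = knn_predict(make_spectrum(2500.0))
```

The paper's reported RMSE values fold in exactly the failure mode this sketch hides: when the training spectra (synthetic models) differ systematically from the observed ones, the nearest neighbours are systematically wrong.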

    RR Lyrae stars as standard candles in the Gaia Data Release 2 Era

    We present results from the analysis of 401 RR Lyrae stars (RRLs) belonging to the field of the Milky Way (MW). For a fraction of them, multi-band (V, Ks, W1) photometry, metal abundances, extinction values and pulsation periods are available in the literature, and accurate trigonometric parallaxes measured by the Gaia mission, alongside Gaia G-band time-series photometry, have become available with the Gaia second data release (DR2) on 2018 April 25. Using a Bayesian fitting approach we derive new near- and mid-infrared period-absolute magnitude-metallicity (PMZ) relations and new absolute magnitude-metallicity relations in the visual (M_V - [Fe/H]) and G bands (M_G - [Fe/H]), based on the Gaia DR2 parallaxes. We find the dependence of luminosity on metallicity to be higher than usually found in the literature, irrespective of the passband considered. Running the adopted Bayesian model on a simulated dataset we show that the high metallicity dependence is not caused by the method, but likely arises from the actual distribution of the data and the presence of a zero-point offset in the Gaia parallaxes. We infer a zero-point offset of -0.057 mas, with the Gaia DR2 parallaxes being systematically smaller. We find the RR Lyrae absolute magnitudes in the V, G, Ks and W1 bands at a metallicity of [Fe/H] = -1.5 dex and a period of P = 0.5238 days, based on Gaia DR2 parallaxes, to be M_V = 0.66 +/- 0.06 mag, M_G = 0.63 +/- 0.08 mag, M_Ks = -0.37 +/- 0.11 mag and M_W1 = -0.41 +/- 0.11 mag, respectively. Comment: 18 pages, 21 figures, 4 tables. Accepted for publication in MNRAS
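The key mechanics, parallax-to-absolute-magnitude conversion with the inferred -0.057 mas zero-point offset, followed by a fit of a linear magnitude-metallicity relation, can be sketched as below. This is a simplified least-squares stand-in for the paper's full Bayesian model, and the slope, intercept and star sample used here are synthetic, not the paper's values.

```python
import numpy as np

ZP_OFFSET = -0.057  # mas; Gaia DR2 parallaxes are systematically smaller

def abs_mag(apparent_g, parallax_mas):
    # Correct the parallax for the zero-point offset, then use
    # M = m + 5 log10(varpi / mas) - 10  (equivalent to M = m - 5 log10(d/pc) + 5).
    varpi = parallax_mas - ZP_OFFSET
    return apparent_g + 5 * np.log10(varpi) - 10

# Synthetic noise-free sample: a made-up linear M_G-[Fe/H] relation.
rng = np.random.default_rng(2)
feh = rng.uniform(-2.5, 0.0, 100)
true_M = 0.63 + 0.3 * (feh + 1.5)
dist_pc = rng.uniform(500.0, 2000.0, 100)
parallax = 1000.0 / dist_pc + ZP_OFFSET   # observed, offset-affected parallaxes
m_g = true_M + 5 * np.log10(dist_pc) - 5  # apparent magnitudes

M = abs_mag(m_g, parallax)
slope, intercept = np.polyfit(feh + 1.5, M, 1)
```

With the offset corrected, the fit recovers the input relation exactly; ignoring the offset is precisely what the paper shows can bias the inferred metallicity dependence.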

    Gaia Data Release 2: using Gaia parallaxes

    Context. The second Gaia data release (Gaia DR2) provides precise five-parameter astrometric data (positions, proper motions and parallaxes) for an unprecedented number of sources (more than 1.3 billion, mostly stars). This new wealth of data will enable the undertaking of statistical analyses of many astrophysical problems that were previously unfeasible for lack of reliable astrometry, and in particular because of the lack of parallaxes. But the use of this wealth of astrometric data comes with a specific challenge: how does one properly infer from these data the astrophysical parameters of interest? Aims. The main - but not only - focus of this paper is the issue of the estimation of distances from parallaxes, possibly combined with other information. We start with a critical review of the methods traditionally used to obtain distances from parallaxes and their shortcomings. Then we provide guidelines on how to use parallaxes more efficiently to estimate distances by using Bayesian methods. In particular, we show that negative parallaxes, or parallaxes with relatively large uncertainties, still contain valuable information. Finally, we provide examples that show more generally how to use astrometric data for parameter estimation, including the combination of proper motions and parallaxes and the handling of covariances in the uncertainties. Methods. The paper contains examples based on simulated Gaia data to illustrate the problems and the solutions proposed. Furthermore, the developments and methods proposed in the paper are linked to a set of tutorials included in the Gaia archive documentation that provide practical examples and a good starting point for the application of the recommendations to actual problems. In all cases the source code for the analysis methods is provided. Results. Our main recommendation is to always treat the derivation of (astro-)physical parameters from astrometric data, in particular when parallaxes are involved, as an inference problem which should preferably be handled with a full Bayesian approach. Conclusions. Gaia will provide fundamental data for many fields of astronomy. Further data releases will provide more and more precise data. Nevertheless, to exploit their full potential it will always be necessary to pay careful attention to the statistical treatment of parallaxes and proper motions. The purpose of this paper is to help astronomers find the correct approach.
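The recommended Bayesian treatment can be sketched in a few lines: rather than inverting the parallax, compute a posterior over distance from a Gaussian parallax likelihood and a distance prior. The exponentially decreasing space density prior used below is one common choice from this literature, and the length scale L is an assumed illustrative value; the grid-search mode is a simple stand-in for a proper posterior summary.

```python
import numpy as np

def distance_posterior_mode(parallax_mas, sigma_mas, L_pc=1350.0):
    # Distance grid in parsecs.
    r = np.linspace(1.0, 20000.0, 200000)
    # Log-posterior: Gaussian likelihood (observed parallax ~ N(1000/r, sigma))
    # plus an exponentially decreasing space density prior, p(r) ~ r^2 exp(-r/L).
    log_post = (-0.5 * ((parallax_mas - 1000.0 / r) / sigma_mas) ** 2
                + 2.0 * np.log(r) - r / L_pc)
    return r[np.argmax(log_post)]

# A precise parallax: the posterior mode is close to the naive 1/parallax distance.
d_good = distance_posterior_mode(10.0, 0.1)   # ~100 pc
# A negative parallax still yields a finite, prior-dominated distance estimate,
# illustrating the point that such measurements retain information.
d_neg = distance_posterior_mode(-0.5, 1.0)
```

Note how the same code handles both cases; naive inversion would return a meaningless negative distance in the second one.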

    From Data to Software to Science with the Rubin Observatory LSST

    The Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST) dataset will dramatically alter our understanding of the Universe, from the origins of the Solar System to the nature of dark matter and dark energy. Much of this research will depend on the existence of robust, tested, and scalable algorithms, software, and services. Identifying and developing such tools ahead of time has the potential to significantly accelerate the delivery of early science from LSST. Developing these collaboratively, and making them broadly available, can enable more inclusive and equitable collaboration on LSST science. To facilitate such opportunities, a community workshop entitled "From Data to Software to Science with the Rubin Observatory LSST" was organized by the LSST Interdisciplinary Network for Collaboration and Computing (LINCC) and partners, and held at the Flatiron Institute in New York, March 28-30th 2022. The workshop included over 50 in-person attendees invited from over 300 applications. It identified seven key software areas of need: (i) scalable cross-matching and distributed joining of catalogs, (ii) robust photometric redshift determination, (iii) software for determination of selection functions, (iv) frameworks for scalable time-series analyses, (v) services for image access and reprocessing at scale, (vi) object image access (cutouts) and analysis at scale, and (vii) scalable job execution systems. This white paper summarizes the discussions of this workshop. It considers the motivating science use cases, identified cross-cutting algorithms, software, and services, their high-level technical specifications, and the principles of inclusive collaborations needed to develop them. 
We provide it as a useful roadmap of needs, as well as to spur action and collaboration between groups and individuals looking to develop reusable software for early LSST science. Comment: White paper from the "From Data to Software to Science with the Rubin Observatory LSST" workshop
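To make need (i), catalog cross-matching, concrete: for each source in one catalog, find the nearest counterpart in another within a matching radius. The brute-force version below only illustrates the operation; LSST-scale matching requires spatial indexing (trees or sky pixelisation) and distributed execution, which is exactly why the workshop flagged it. Coordinates are synthetic, in degrees, on a patch small enough that a flat-sky approximation is acceptable.

```python
import numpy as np

rng = np.random.default_rng(3)
cat_a = rng.uniform(0.0, 1.0, size=(100, 2))             # (ra, dec) pairs, degrees
# Counterparts of the same sources with small astrometric scatter, reordered.
cat_b = cat_a + rng.normal(0.0, 5e-5, size=cat_a.shape)
rng.shuffle(cat_b)

radius_deg = 1.0 / 3600.0                                # 1 arcsec match radius

def crossmatch(a, b, radius):
    # All pairwise separations (flat-sky), nearest neighbour per row,
    # then a mask selecting matches inside the radius.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    idx = d.argmin(axis=1)
    matched = d[np.arange(len(a)), idx] < radius
    return idx, matched

idx, matched = crossmatch(cat_a, cat_b, radius_deg)
```

The O(N*M) distance matrix here is the scalability wall: it is fine for hundreds of sources and hopeless for billions, motivating the distributed-join tooling the white paper calls for.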

    Infected pancreatic necrosis: outcomes and clinical predictors of mortality. A post hoc analysis of the MANCTRA-1 international study

    The identification of high-risk patients in the early stages of infected pancreatic necrosis (IPN) is critical, because it could help clinicians adopt more effective management strategies. We conducted a post hoc analysis of the MANCTRA-1 international study to assess the association between clinical risk factors and mortality among adult patients with IPN. Univariable and multivariable logistic regression models were used to identify prognostic factors of mortality. We identified 247 consecutive patients with IPN hospitalised between January 2019 and December 2020. History of uncontrolled arterial hypertension (p = 0.032; 95% CI 1.135-15.882; aOR 4.245), qSOFA (p = 0.005; 95% CI 1.359-5.879; aOR 2.828), renal failure (p = 0.022; 95% CI 1.138-5.442; aOR 2.489), and haemodynamic failure (p = 0.018; 95% CI 1.184-5.978; aOR 2.661) were identified as independent predictors of mortality in IPN patients. Cholangitis (p = 0.003; 95% CI 1.598-9.930; aOR 3.983), abdominal compartment syndrome (p = 0.032; 95% CI 1.090-6.967; aOR 2.735), and gastrointestinal/intra-abdominal bleeding (p = 0.009; 95% CI 1.286-5.712; aOR 2.710) were independently associated with the risk of mortality. Upfront open surgical necrosectomy was strongly associated with the risk of mortality (p < 0.001; 95% CI 1.912-7.442; aOR 3.772), whereas endoscopic drainage of pancreatic necrosis (p = 0.018; 95% CI 0.138-0.834; aOR 0.339) and enteral nutrition (p = 0.003; 95% CI 0.143-0.716; aOR 0.320) were found to be protective factors. Organ failure, acute cholangitis, and upfront open surgical necrosectomy were the most significant predictors of mortality. Our study confirmed that, even in a subgroup of particularly ill patients such as those with IPN, upfront open surgery should be avoided as much as possible. Study protocol registered in ClinicalTrials.gov (ID NCT04747990).
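For readers decoding the statistics above: a multivariable logistic-regression coefficient beta maps to an adjusted odds ratio (aOR) via exp(beta), and its 95% confidence interval via exp(beta +/- 1.96 * SE). The beta and SE below are hypothetical placeholders for illustration, not values from the MANCTRA-1 analysis.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    # Point estimate and 95% CI of the odds ratio implied by a
    # logistic-regression coefficient and its standard error.
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient beta = 1.0 with SE = 0.35.
aor, lo, hi = odds_ratio_ci(beta=1.0, se=0.35)
```

An aOR above 1 with a CI excluding 1 (as for the mortality predictors reported) indicates increased odds; an aOR below 1 with a CI excluding 1 (as for endoscopic drainage and enteral nutrition) indicates a protective association.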

    Classification of variable stars in the WFCAM Transit Survey

    The WFCAM Transit Survey is a photometric survey in the near-infrared that aims to find Earth-like planets transiting M-dwarf stars. As a by-product of the survey, a variety of variable stars have been detected. We report the discovery and classification of 192 periodic variable stars in the WFCAM Transit Survey; 185 of these objects are previously unknown variable sources. The derived light-curve parameters will help build a robust sample of classified variable-star light curves, and their parameters, in the near-infrared for the automatic classification of light curves of stellar objects in the J-band.
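A basic building block behind this kind of periodic-variable work is phase-folding a light curve at trial periods: a simple phase-dispersion statistic (the scatter of magnitudes within phase bins) is smallest near the true period. The light curve below is synthetic, and real classifiers such as the survey's use many more light-curve features than this single statistic.

```python
import numpy as np

rng = np.random.default_rng(4)
true_period = 0.7                     # days
t = np.sort(rng.uniform(0.0, 30.0, 400))
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / true_period) \
      + 0.01 * rng.standard_normal(400)

def phase_dispersion(t, mag, period, n_bins=10):
    # Fold the times at the trial period and bin by phase.
    phase = (t / period) % 1.0
    bins = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    # Mean within-bin standard deviation; small when the fold is coherent.
    return np.mean([mag[bins == b].std()
                    for b in range(n_bins) if (bins == b).any()])

trial_periods = np.linspace(0.5, 1.0, 501)
scores = [phase_dispersion(t, mag, p) for p in trial_periods]
best = trial_periods[int(np.argmin(scores))]
```

Scanning the dispersion over a period grid and taking the minimum recovers the injected 0.7-day period; the folded light-curve shape at that period is then what feeds classification.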